PCA in Autocorrelation Space
Authors
Abstract
The use of higher-order autocorrelations as features for pattern classification has usually been restricted to the second or third order because of their high computational cost. Since the autocorrelation space is a high-dimensional space, we are interested in reducing the dimensionality of the feature vectors for the benefit of the pattern classification task. An established technique for this is Principal Component Analysis (PCA), which, however, cannot be applied directly in the autocorrelation space. In this paper we develop a new method for performing PCA in the autocorrelation space without explicitly computing the autocorrelations. Connections with nonlinear PCA and possible extensions are also discussed.
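To make the idea concrete, here is a minimal, illustrative Python sketch of PCA over implicit higher-order autocorrelation features, assuming one-dimensional real signals and circular shifts: under those assumptions the inner product of two order-n autocorrelation vectors reduces to a sum of powered circular cross-correlation values, so the Gram matrix can be built and diagonalized without ever forming the feature vectors. The function names, the FFT-based cross-correlation, and the centring conventions are assumptions of this sketch, not necessarily the paper's exact formulation.

    import numpy as np

    def autocorr_kernel(x, y, order=2):
        # Inner product of the order-n autocorrelation feature vectors of x and y,
        # computed without ever forming those (exponentially large) vectors.
        # With circular shifts, <r_x, r_y> = sum_d C_xy(d) ** (order + 1),
        # where C_xy is the circular cross-correlation of x and y.
        c = np.fft.ifft(np.conj(np.fft.fft(x)) * np.fft.fft(y)).real
        return float(np.sum(c ** (order + 1)))

    def autocorr_pca(signals, order=2, n_components=2):
        # Kernel-style PCA over the implicit autocorrelation features:
        # Gram matrix -> double centring in feature space -> leading eigenvectors.
        n = len(signals)
        K = np.array([[autocorr_kernel(a, b, order) for b in signals] for a in signals])
        ones = np.full((n, n), 1.0 / n)
        Kc = K - ones @ K - K @ ones + ones @ K @ ones   # centre in feature space
        w, V = np.linalg.eigh(Kc)                        # eigenvalues in ascending order
        idx = np.argsort(w)[::-1][:n_components]
        w, V = w[idx], V[:, idx]
        alphas = V / np.sqrt(np.maximum(w, 1e-12))       # normalise expansion coefficients
        return Kc @ alphas                               # projections of the training signals

    # Toy usage: project random 1-D signals onto two principal directions
    # of their (implicit) second-order autocorrelation features.
    signals = [np.random.randn(64) for _ in range(10)]
    print(autocorr_pca(signals, order=2, n_components=2).shape)   # (10, 2)

The motivation for the kernel-style evaluation is that an order-n autocorrelation vector of a length-N signal has on the order of N^n entries, whereas each inner product above costs only O(N log N), which is what makes PCA in the autocorrelation space tractable.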
Similar resources
A Kernel Version of Spatial Factor Analysis
Based on work by Pearson [1] in 1901, Hotelling [2] in 1933 introduced principal component analysis (PCA). PCA is often used for general feature generation and linear orthogonalization, or for compression by dimensionality reduction of correlated multivariate data; see Jolliffe [3] for a comprehensive description of PCA and related techniques. An interesting dilemma in reduction of dimensionality of...
Recognizing Faces using Kernel Eigenfaces and Support Vector Machines
In face recognition, Principal Component Analysis (PCA) is often used to extract a low-dimensional face representation based on the eigenvectors of the face-image autocorrelation matrix. Kernel Principal Component Analysis (Kernel PCA) has recently been proposed as a non-linear extension of PCA. While PCA is able to discover and represent linearly embedded manifolds, Kernel PCA can extract low d...
Statistical shape analysis using non-Euclidean metrics
The contribution of this paper is the adaptation of data-driven methods for non-Euclidean metric decomposition of tangent-space shape coordinates. The basic idea is to extend principal component analysis (PCA) to take into account the noise variance at different landmarks and at different shapes. We show examples where these non-Euclidean metric methods allow for easier interpretation by decomp...
Scale Invariant Texture Analysis Using Multi-scale Local Autocorrelation Features
We have developed a new framework for scale-invariant texture analysis using multi-scale local autocorrelation features. The multi-scale features are formed by concatenating feature vectors of different scales, which are calculated from higher-order local autocorrelation functions (see the sketch after this list for an illustrative computation). To classify different types of textures among the given test images, a linear discriminant classifier (LDA) is employe...
Representing Spectral data using LabPQR color space in comparison to PCA method
In many applications of color technology, such as spectral color reproduction, it is of interest to represent spectral data with fewer dimensions than those of the spectral space. For more than half a century, the Principal Component Analysis (PCA) method has been applied to find the number of independent basis vectors of a spectral dataset and to represent spectral reflectance with lower di...
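As a rough illustration of the multi-scale local autocorrelation features mentioned in the texture-analysis entry above, the sketch below sums second-order local products over a small, hand-picked set of displacement pairs at several image scales and concatenates the results; the displacement set, the naive down-sampling, and the scale choices are assumptions made for illustration and are not the mask patterns or scale pyramid used in that work.

    import numpy as np

    def local_autocorr_features(img, displacement_pairs):
        # Second-order local autocorrelation sums of a 2-D array:
        #   f(d1, d2) = sum_r img(r) * img(r + d1) * img(r + d2)
        # over a small, hand-picked set of displacement pairs (illustrative only;
        # the actual local mask patterns used in that work may differ).
        feats = []
        for d1, d2 in displacement_pairs:
            s1 = np.roll(img, shift=d1, axis=(0, 1))
            s2 = np.roll(img, shift=d2, axis=(0, 1))
            feats.append(np.sum(img * s1 * s2))
        return np.array(feats)

    def multiscale_features(img, scales=(1, 2, 4)):
        # Concatenate feature vectors computed at several image scales,
        # emulating "concatenated feature vectors of different scales".
        pairs = [((0, 0), (0, 0)), ((0, 1), (0, 0)), ((1, 0), (0, 1)), ((1, 1), (1, 0))]
        return np.concatenate(
            [local_autocorr_features(img[::s, ::s], pairs) for s in scales]
        )

    # Toy usage on a random patch: 4 displacement pairs x 3 scales = 12 features.
    patch = np.random.rand(64, 64)
    print(multiscale_features(patch).shape)   # (12,)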
Journal:
Volume / Issue:
Pages: -
Publication date: 2002